Speed Learning by Adaptive Skipping: Improving the Learning Rate of Artificial Neural Network through Adaptive Stochastic Sample Presentation
Author
Abstract
The basic idea of this paper is to increase the learning rate of an artificial neural network without affecting the accuracy of the system. New algorithms are presented for dynamically reducing the number of input samples presented to the ANN (Artificial Neural Network), thereby increasing the rate of learning. This method is called Adaptive Skipping, and it can be used alongside any supervised learning algorithm. The training phase is the most crucial and time-consuming part of an ANN, and the rate at which the ANN learns is a major consideration. Among the factors affecting the learning rate, the size of the training set (the number of input samples used to train an ANN for a specific application) is considered, and how the size of the training set affects the learning rate and accuracy of an ANN is discussed. The related works done in this ...
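The abstract only names the idea, so the following is a minimal sketch of what adaptive sample skipping could look like, not the paper's exact algorithm: a sample that is classified correctly is skipped for a growing number of subsequent epochs (an exponential schedule, in the spirit of the EAST variant listed below), while misclassified samples are always presented and updated. All names and the doubling schedule are illustrative assumptions.

```python
# Illustrative sketch of adaptive skipping on a toy perceptron.
# The skipping schedule (doubling on each correct classification) is an
# assumption for demonstration, not the published Adaptive Skipping rule.
import numpy as np

def train_with_adaptive_skipping(X, y, epochs=50, lr=0.1, seed=0):
    """Perceptron training where correctly classified samples are
    skipped for an exponentially growing number of epochs."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    skip_until = np.zeros(len(X), dtype=int)  # first epoch at which sample reappears
    skip_len = np.ones(len(X), dtype=int)     # current skipping interval per sample
    presentations = 0                          # total samples actually shown
    for epoch in range(epochs):
        for i in range(len(X)):
            if epoch < skip_until[i]:
                continue                       # sample skipped this epoch
            presentations += 1
            pred = 1 if X[i] @ w + b > 0 else 0
            if pred == y[i]:
                # Correct: skip for skip_len epochs, then double the interval.
                skip_until[i] = epoch + 1 + skip_len[i]
                skip_len[i] *= 2
            else:
                # Wrong: perceptron update, and reset skipping for this sample.
                delta = lr * (y[i] - pred)
                w += delta * X[i]
                b += delta
                skip_len[i] = 1
                skip_until[i] = 0
    return w, b, presentations

# Linearly separable toy data: label 1 iff x0 + x1 > 1.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 1).astype(int)
w, b, shown = train_with_adaptive_skipping(X, y)
acc = np.mean(((X @ w + b) > 0).astype(int) == y)
print("presentations:", shown, "of", 200 * 50, "| accuracy:", round(acc, 2))
```

The point of the sketch is the trade the abstract describes: `presentations` ends up well below the `epochs * n_samples` of plain epoch-wise training, because easy samples are shown ever more rarely, while the hard (misclassified) samples near the decision boundary still drive every weight update.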
Similar resources
A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network
Abstract Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). Absence of an appropriate structure, convergence to local optima, and slow learning are deficiencies of FWNNs in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address the aforementioned learning deficiencies. Differential Evolution...
Cystoscopy Image Classification Using Deep Convolutional Neural Networks
In the past three decades, the use of smart methods in medical diagnostic systems has attracted the attention of many researchers. However, no smart activity has been provided in the field of medical image processing for diagnosis of bladder cancer through cystoscopy images despite the high prevalence in the world. In this paper, two well-known convolutional neural networks (CNNs) ...
Designing stable neural identifier based on Lyapunov method
The stability of learning rate in neural network identifiers and controllers is one of the challenging issues which attracts great interest from researchers of neural networks. This paper suggests adaptive gradient descent algorithm with stable learning laws for modified dynamic neural network (MDNN) and studies the stability of this algorithm. Also, stable learning algorithm for parameters of ...
INTEGRATED ADAPTIVE FUZZY CLUSTERING (IAFC) NEURAL NETWORKS USING FUZZY LEARNING RULES
The proposed IAFC neural networks have both stability and plasticity because they use a control structure similar to that of the ART-1 (Adaptive Resonance Theory) neural network. The unsupervised IAFC neural network is the unsupervised neural network which uses the fuzzy leaky learning rule. This fuzzy leaky learning rule controls the updating amounts by fuzzy membership values. The supervised IAFC ...
EAST: An Exponential Adaptive Skipping Training Algorithm for Multilayer Feedforward Neural Networks
Multilayer Feedforward Neural Networks (MFNNs) have been widely applied to solve a wide range of supervised pattern recognition tasks. The major problem in the MFNN training phase is its long training time, especially when the network is trained on very large training datasets. Accordingly, an enhanced training algorithm called the Exponential Adaptive Skipping Training (EAST) algorithm is propose...
Publication year: 2011